    Quantum circuit fidelity estimation using machine learning

    The computational power of real-world quantum computers is limited by errors. When using quantum computers to perform algorithms which cannot be efficiently simulated classically, it is important to quantify the accuracy with which the computation has been performed. In this work we introduce a machine-learning-based technique to estimate the fidelity between the state produced by a noisy quantum circuit and the target state corresponding to ideal noise-free computation. Our machine learning model is trained in a supervised manner, using smaller or simpler circuits for which the fidelity can be estimated using other techniques like direct fidelity estimation and quantum state tomography. We demonstrate that, for simulated random quantum circuits with a realistic noise model, the trained model can predict the fidelities of more complicated circuits for which such methods are infeasible. In particular, we show the trained model may make predictions for circuits with higher degrees of entanglement than were available in the training set, and that the model may make predictions for non-Clifford circuits even when the training set included only Clifford-reducible circuits. This empirical demonstration suggests classical machine learning may be useful for making predictions about beyond-classical quantum circuits for some non-trivial problems. (Comment: 27 pages, 6 figures)
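
    A minimal sketch in the spirit of the supervised approach described above: train a classical regressor on simple circuit descriptors, with fidelity labels obtained from small circuits. The descriptor set (depth, two-qubit gate count, qubit count), the synthetic labels, and the gradient-boosting model are illustrative assumptions, not the paper's actual features or architecture.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical circuit descriptors: depth, two-qubit gate count, qubit count.
n_circuits = 500
X = np.column_stack([
    rng.integers(1, 20, n_circuits),   # circuit depth
    rng.integers(0, 40, n_circuits),   # number of two-qubit gates
    rng.integers(2, 8, n_circuits),    # number of qubits
])

# Stand-in fidelity labels; in practice these would come from direct fidelity
# estimation or state tomography on small or Clifford-reducible circuits.
per_gate_error = 0.005
y = np.exp(-per_gate_error * (X[:, 0] + 2 * X[:, 1])) + rng.normal(0, 0.01, n_circuits)

# Fit on "simple" circuits, then score on held-out ones.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("held-out R^2:", model.score(X_test, y_test))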

    The GENIE Neutrino Monte Carlo Generator: Physics and User Manual

    GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model, and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: it presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.

    Quantum Machine Learning in High Energy Physics

    Machine learning has been used in high energy physics for a long time, primarily at the analysis level with supervised classification. Quantum computing was postulated in the early 1980s as a way to perform computations that would not be tractable with a classical computer. With the advent of noisy intermediate-scale quantum computing devices, more quantum algorithms are being developed with the aim of exploiting the capacity of the hardware for machine learning applications. An interesting question is whether there are ways to combine quantum machine learning with high energy physics. This paper reviews the first generation of ideas that use quantum machine learning on problems in high energy physics and provides an outlook on future applications. (Comment: 25 pages, 9 figures, submitted to Machine Learning: Science and Technology, Focus on Machine Learning for Fundamental Physics collection)
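
    A toy variational quantum classifier of the kind this review surveys, sketched with PennyLane. The two-feature "events", the circuit layout (angle embedding plus entangling layers), and the training setup are illustrative assumptions rather than any specific proposal from the paper.

import pennylane as qml
from pennylane import numpy as np

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, x):
    qml.AngleEmbedding(x, wires=range(n_qubits))               # encode event features
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))   # trainable layers
    return qml.expval(qml.PauliZ(0))                           # score in [-1, 1]

def cost(weights, X, y):
    # Mean squared error between circuit outputs and +/-1 labels.
    loss = 0.0
    for x, label in zip(X, y):
        loss = loss + (circuit(weights, x) - label) ** 2
    return loss / len(X)

# Toy "signal" (+1) vs "background" (-1) events with two kinematic features.
X = np.array([[0.1, 0.2], [0.2, 0.1], [2.9, 3.0], [3.1, 2.8]], requires_grad=False)
y = np.array([1.0, 1.0, -1.0, -1.0], requires_grad=False)

weights = np.array(np.random.uniform(0, np.pi, (3, n_qubits)), requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.3)
for _ in range(50):
    weights = opt.step(lambda w: cost(w, X, y), weights)
print("trained predictions:", [float(circuit(weights, x)) for x in X])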

    Neural network accelerator for quantum control

    Efficient quantum control is necessary for practical quantum computing implementations with current technologies. Conventional algorithms for determining optimal control parameters are computationally expensive, largely excluding them from use outside of simulation. Existing hardware solutions structured as lookup tables are imprecise and costly. By designing a machine learning model to approximate the results of traditional tools, a more efficient method can be produced. Such a model can then be synthesized into a hardware accelerator for use in quantum systems. In this study, we demonstrate a machine learning algorithm for predicting optimal pulse parameters. This algorithm is lightweight enough to fit on a low-resource FPGA and perform inference with a latency of 175 ns and a pipeline interval of 5 ns, with >~0.99 gate fidelity. In the long term, such an accelerator could be used near quantum computing hardware where traditional computers cannot operate, enabling quantum control at a reasonable cost at low latencies without incurring large data bandwidths outside of the cryogenic environment. (Comment: 7 pages, 10 figures)
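
    A minimal sketch of the idea described above: a deliberately tiny network learns to reproduce the output of a slow pulse-optimization routine, so that inference can later run on low-resource hardware. The "expensive_optimizer" stand-in, the pulse parametrization, and the network size are illustrative assumptions, not the paper's design.

import torch
import torch.nn as nn

def expensive_optimizer(theta):
    # Placeholder for a slow optimal-control calculation: pretend the optimal
    # pulse amplitude and duration are simple functions of the target angle.
    return torch.stack([torch.sin(theta / 2), 0.1 + 0.05 * theta], dim=-1)

# Generate training data offline by querying the slow tool.
theta = torch.linspace(0, torch.pi, 256).unsqueeze(-1)
pulses = expensive_optimizer(theta.squeeze(-1))

# A tiny MLP, sized so it could plausibly fit on low-resource hardware.
model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(theta), pulses)
    loss.backward()
    opt.step()

print("final training loss:", loss.item())
print("predicted pulse for theta=pi/2:", model(torch.tensor([[torch.pi / 2]])).tolist())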